    An investigation into semi-automated 3D city modelling

    Creating three-dimensional digital representations of urban areas, also known as 3D city modelling, is essential in many applications, such as urban planning, radio-frequency signal propagation, flight simulation and vehicle navigation, which are of increasing importance in modern urban centres. The main aim of the thesis is the development of an innovative, semi-automated workflow for creating 3D city models using aerial photographs and LiDAR data collected from various airborne sensors. The complexity of this aim necessitates an efficient and reliable way to progress from manually intensive operations to an increased level of automation. The proposed methodology exploits the combination of different datasets, also known as data fusion, to achieve reliable results in different study areas. Data fusion techniques are used to combine linear features extracted from aerial photographs with either LiDAR data or any other available source, including Very Dense Digital Surface Models (VDDSMs). The research proposes a semi-automated technique for 3D city modelling that fuses LiDAR (if available) or VDDSMs with 3D linear features extracted from stereo pairs of photographs. Building detection and generation of the building footprint are performed with a plane-fitting algorithm applied to the LiDAR data or VDDSMs, using conditions based on the slope of the roofs and the minimum size of the buildings. The initial building footprint is subsequently generalized using a simplification algorithm that enforces orthogonality between the individual linear segments within a defined tolerance. The final refinement of the building outline is performed for each linear segment using the filtered stereo-matched points with a least-squares estimation. The digital reconstruction of the roof shapes is performed by applying a least-squares plane-fitting algorithm to the classified VDDSMs, constrained by the building outlines, the minimum size of the planes and the maximum height tolerance between adjacent 3D points. Subsequently, neighbouring planes are merged using Boolean operations to generate solid features. The results indicate very detailed building models; various roof details such as dormers and chimneys are successfully reconstructed in most cases.
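
    As an illustration of the plane-fitting step described above, the following minimal Python/NumPy sketch fits a least-squares plane to a candidate roof patch and tests it against slope and size conditions. The input format and the threshold values are assumptions for the example, not the thesis's actual parameters.

        import numpy as np

        def fit_plane(points):
            """Least-squares plane fit to an (N, 3) array of 3D points.

            Returns the unit normal of the plane and the patch centroid.
            The normal is the direction of least variance, i.e. the
            singular vector for the smallest singular value.
            """
            centroid = points.mean(axis=0)
            _, _, vt = np.linalg.svd(points - centroid)
            return vt[-1], centroid

        def is_roof_plane(points, max_slope_deg=60.0, min_points=50):
            """Accept a patch as a roof plane if it has enough points and
            its slope (angle of the normal from vertical) is within bounds."""
            if len(points) < min_points:
                return False
            normal, _ = fit_plane(points)
            slope = np.degrees(np.arccos(abs(normal[2])))  # 0 deg = flat roof
            return slope <= max_slope_deg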

    An approach to produce a GIS database for road surface monitoring

    Road Surface Monitoring (RSM) is the process of detecting distress on paved or unpaved road surfaces. The primary aim of this process is to detect any distress (such as road surface cracks) at an early stage so that maintenance can be applied on time. Early detection of road cracks allows maintenance before repair costs become too high. Local authorities should have an effective and easy-to-use monitoring process in place across the road network to meet their obligations. The process of adding geographical identification metadata to photos is called “geo-tagging”. The method proposed in this work entails capturing GPS information when a photo of the road surface distress is taken and then attaching the photo to a map. Geo-tagging the photos in this way enriches the GIS dataset, since each road surface distress photo is tied to a location on the digital map. This paper proposes a system for establishing a GIS database of geo-tagged photos that allows local authorities to automate the process of recording and reporting road surface distress. The system is easy to use, cost-effective, deployable, and can be used effectively by local authorities.
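
    As a sketch of the geo-tagging step described above, the following Python snippet reads the GPS EXIF tags from a distress photo with Pillow and emits a GeoJSON point feature that a GIS can place on the road map. The tag IDs follow the EXIF standard (0x8825 is the GPS IFD); the file name and the "distress" property are hypothetical.

        import json
        from PIL import Image

        def _to_degrees(dms, ref):
            """Convert EXIF (degrees, minutes, seconds) rationals to a signed float."""
            deg = float(dms[0]) + float(dms[1]) / 60.0 + float(dms[2]) / 3600.0
            return -deg if ref in ("S", "W") else deg

        def photo_to_feature(path, distress="crack"):
            """Build a GeoJSON feature from the photo's embedded GPS position."""
            gps = Image.open(path).getexif().get_ifd(0x8825)
            lat = _to_degrees(gps[2], gps[1])  # GPSLatitude, GPSLatitudeRef
            lon = _to_degrees(gps[4], gps[3])  # GPSLongitude, GPSLongitudeRef
            return {
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": [lon, lat]},
                "properties": {"photo": path, "distress": distress},
            }

        # print(json.dumps(photo_to_feature("pothole_001.jpg"), indent=2))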

    Implementation of On-Line Data Reduction Algorithms in the CMS Endcap Preshower Data Concentrator Card

    The CMS Endcap Preshower (ES) sub-detector comprises 4288 silicon sensors, each containing 32 strips. The data are transferred from the detector to the counting room via 1208 optical fibres running at 800 Mbps, each carrying data from 2, 3 or 4 sensors. For the readout of the Preshower, a VME-based system, the Endcap Preshower Data Concentrator Card (ES-DCC), is currently under development. The main objective of each readout board is to acquire on-detector data from up to 36 optical links, perform on-line data reduction (zero suppression) and pass the concentrated data to the CMS event builder. This document presents the conceptual design of the reduction algorithms as well as their implementation in the ES-DCC FPGAs. The algorithms implemented in the ES-DCC achieve a reduction factor of ~20.
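
    To make the zero-suppression idea concrete, here is an illustrative Python sketch of the reduction: only strips whose ADC value exceeds a threshold are kept, each paired with its strip address, so mostly-empty sensors compress strongly. The threshold and the output packing are placeholders, not the board's actual firmware parameters.

        def zero_suppress(strips, threshold=12):
            """Return (strip_index, adc) pairs for strips above threshold.

            `strips` holds the 32 ADC values read from one silicon sensor.
            """
            return [(i, adc) for i, adc in enumerate(strips) if adc > threshold]

        # A quiet sensor with two hit strips shrinks from 32 words to 2,
        # the kind of behaviour that yields a large average reduction factor.
        sensor = [3] * 32
        sensor[7], sensor[21] = 240, 95
        print(zero_suppress(sensor))  # [(7, 240), (21, 95)]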

    Energy Resolution Performance of the CMS Electromagnetic Calorimeter

    The energy resolution performance of the CMS lead tungstate crystal electromagnetic calorimeter is presented. Measurements were made with an electron beam using a fully equipped supermodule of the calorimeter barrel. Results are given both for electrons incident on the centre of crystals and for electrons distributed uniformly over the calorimeter surface. The electron energy is reconstructed in matrices of 3×3 or 5×5 crystals centred on the crystal containing the maximum energy. Corrections for variations in the shower containment are applied in the case of uniform incidence. The measured resolution is consistent with the design goals.
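
    The clustering described above can be sketched in a few lines of Python/NumPy: the electron energy is the sum over an n×n crystal matrix centred on the crystal with the maximum energy. The grid here is a toy array, not ECAL geometry, and no containment corrections are applied.

        import numpy as np

        def matrix_energy(grid, n=3):
            """Sum deposits in an n-by-n window centred on the hottest crystal."""
            assert n % 2 == 1, "window must have an odd size (3x3, 5x5, ...)"
            row, col = np.unravel_index(np.argmax(grid), grid.shape)
            h = n // 2
            window = grid[max(row - h, 0):row + h + 1,
                          max(col - h, 0):col + h + 1]
            return float(window.sum())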

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from the Monte Carlo generators MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the HT distributions at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values.
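
    Two of the measured variables can be written down directly; the Python sketch below computes HT as the scalar sum of jet transverse momenta, and the jet-muon azimuthal separation folded into [0, π]. The plain (pT, φ) inputs are a simplification of the experiment's event format.

        import math

        def ht(jet_pts):
            """Scalar sum of jet transverse momenta."""
            return sum(jet_pts)

        def delta_phi(phi_jet, phi_muon):
            """Azimuthal separation, folded into [0, pi]."""
            dphi = abs(phi_jet - phi_muon) % (2 * math.pi)
            return 2 * math.pi - dphi if dphi > math.pi else dphi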

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur`an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to human inability to predict the future precisely, as written in Al-Qur`an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that attain at least a certain expected return (or, alternatively, to maximize the expected return at a given level of risk). This study focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function x^T Q x subject to the constraints μ^T x ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, obtained with the MATLAB R2007b software together with its graphical analysis.
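
    The quadratic program stated above can be sketched as follows; the study used MATLAB, but the same problem is expressed here with SciPy (solved with SLSQP), minimizing x^T Q x subject to μ^T x ≥ r and the budget constraint Σx = 1 (a special case of Ax = b), with no short selling. The expected returns, covariance matrix and required return are invented for the example.

        import numpy as np
        from scipy.optimize import minimize

        mu = np.array([0.10, 0.07, 0.03])        # expected returns (assumed)
        Q = np.array([[0.040, 0.006, 0.002],     # covariance matrix (assumed)
                      [0.006, 0.025, 0.004],
                      [0.002, 0.004, 0.010]])
        r = 0.06                                 # required expected return

        res = minimize(
            lambda x: x @ Q @ x,                 # portfolio variance
            x0=np.full(3, 1 / 3),
            bounds=[(0.0, 1.0)] * 3,             # no short selling
            constraints=[
                {"type": "ineq", "fun": lambda x: mu @ x - r},   # mu^T x >= r
                {"type": "eq",   "fun": lambda x: x.sum() - 1},  # sum(x) = 1
            ],
        )
        print(res.x.round(3), float(res.x @ Q @ res.x))  # weights, variance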

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency, not readily reconcilable with the demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of those currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and keepers of the gates to justice. Two issues emerge from the analysis that are worthy of further reflection: first, the problematic long-term sustainability of some ATE products; second, the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.

    Assessment of the Financial Performance of Cooperatives in Pelalawan Regency

    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method measures cooperative performance in terms of productivity, efficiency, growth, liquidity, and solvability. The productivity of cooperatives in Pelalawan was high but their efficiency was still low. Profit and income were high, liquidity was very high, and solvability was good.
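
    For reference, the kind of ratios behind such an assessment can be sketched as below; these are the standard textbook definitions, which may differ in detail from the measures used in the study, and the figures are hypothetical.

        def liquidity(current_assets, current_liabilities):
            """Ability to cover short-term obligations; higher is more liquid."""
            return current_assets / current_liabilities

        def solvability(total_assets, total_liabilities):
            """Ability to cover all debt from assets; higher is more solvent."""
            return total_assets / total_liabilities

        # e.g. a cooperative with Rp 500m in current assets and Rp 100m in
        # current liabilities has a liquidity ratio of 5.0 ("very high").
        print(liquidity(500e6, 100e6))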

    Search for stop and higgsino production using diphoton Higgs boson decays

    Results are presented of a search for a "natural" supersymmetry scenario with gauge-mediated symmetry breaking. It is assumed that only the supersymmetric partners of the top quark (stop) and the Higgs boson (higgsino) are accessible. Events are examined in which there are two photons forming a Higgs boson candidate, and at least two b-quark jets. In 19.7 inverse femtobarns of proton-proton collision data at sqrt(s) = 8 TeV, recorded by the CMS experiment, no evidence of a signal is found, and lower limits at the 95% confidence level are set, excluding stop masses below 360 to 410 GeV, depending on the higgsino mass.